# Japanese Whole Word Masking

- **Bert Base Japanese Whole Word Masking** (tohoku-nlp): a BERT base model pretrained on Japanese text using IPA dictionary tokenization and whole word masking. Tags: Large Language Model, Japanese. 113.33k downloads, 65 likes.
- **Bert Large Japanese** (tohoku-nlp): a BERT large model pretrained on Japanese Wikipedia using Unidic dictionary tokenization and whole word masking. Tags: Large Language Model, Japanese. 1,272 downloads, 9 likes.
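
Whole word masking, the pretraining technique both models use, differs from standard token masking: when any WordPiece of a word is selected for masking, every piece of that word is masked together. A minimal sketch in Python, assuming WordPiece-style `##` continuation markers (the function name and masking rate are illustrative, not part of either model's actual training code):

```python
import random

def whole_word_mask(tokens, mask_rate=0.15, seed=0):
    # Group WordPiece tokens into whole words: a token starting
    # with "##" continues the previous word.
    words = []
    for i, tok in enumerate(tokens):
        if tok.startswith("##") and words:
            words[-1].append(i)
        else:
            words.append([i])

    rng = random.Random(seed)
    masked = list(tokens)
    for word in words:
        if rng.random() < mask_rate:
            # Mask every sub-token of the selected word,
            # not just the individual piece.
            for i in word:
                masked[i] = "[MASK]"
    return masked
```

Because selection happens per word rather than per token, a continuation piece like `##ing` is always masked together with its leading piece, which is the property that distinguishes whole word masking from BERT's original per-token scheme.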